
    Faster universal modeling for two source classes

    The Universal Modeling algorithms proposed in [2] for two general classes of finite-context sources are reviewed. Those methods were constructed by viewing a model structure as a partition of the context space and realizing that a partition can be reached through successive splits. Here we start by constructing recursive counting algorithms that count all models belonging to the two classes, and we use these algorithms to perform the Bayesian mixture. The resulting methods lead to computationally more efficient Universal Modeling algorithms
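
    As a rough illustration of the mixture-over-splits idea, here is a minimal context-tree weighting style sketch in Python (binary alphabet, KT estimators). This is an assumption-laden illustration, not the recursive counting algorithm of the paper.

```python
# Minimal context-tree weighting style sketch (binary alphabet, KT estimators).
# This is NOT the paper's counting algorithm; it only illustrates the idea of
# a Bayesian mixture over all models reachable by successive context splits.

from math import log2

class Node:
    def __init__(self):
        self.counts = [0, 0]          # zeros / ones observed in this context
        self.children = [None, None]  # split on the next context bit
        self.log_pe = 0.0             # log2 of the KT estimate at this node
        self.log_pw = 0.0             # log2 of the weighted (mixture) probability

def update(node, context, symbol, depth):
    """Process one symbol whose context is given most-recent-bit first."""
    n0, n1 = node.counts
    # sequential KT update: P(next = s) = (n_s + 1/2) / (n_0 + n_1 + 1)
    node.log_pe += log2((node.counts[symbol] + 0.5) / (n0 + n1 + 1))
    node.counts[symbol] += 1
    if depth == 0:                    # leaf: no further splits possible
        node.log_pw = node.log_pe
        return
    bit = context[0]
    if node.children[bit] is None:
        node.children[bit] = Node()
    update(node.children[bit], context[1:], symbol, depth - 1)
    # mixture: half weight on "stop splitting here", half on "split further"
    log_split = sum(c.log_pw for c in node.children if c is not None)
    a, b = node.log_pe, log_split
    m = max(a, b)
    node.log_pw = m + log2(2 ** (a - m) + 2 ** (b - m)) - 1.0

# usage: total mixture code length assigned to a short binary sequence
D = 3                                  # maximum context depth
root = Node()
seq = [0, 1, 1, 0, 1, 1, 1, 0, 1, 1]
past = [0] * D                         # assume an all-zero past before the data
for t, s in enumerate(seq):
    ctx = list(reversed((past + seq[:t])[-D:]))   # most recent bit first
    update(root, ctx, s, D)
print(f"mixture code length: {-root.log_pw:.2f} bits")
```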

    Privacy leakage in biometric secrecy systems

    Motivated by Maurer [1993], Ahlswede and Csiszar [1993] introduced the concept of secret sharing. In their source model two terminals observe two correlated sequences. It is the objective of both terminals to form a common secret by exchanging a public message (helper data) that should contain only a negligible amount of information about the secret. Ahlswede and Csiszar showed that the maximum secret-key rate that can be achieved in this way is equal to the mutual information between the two source outputs. In a biometric setting, where the sequences correspond to the enrollment and authentication data, it is crucial that the public message leaks as little information as possible about the biometric data, since compromised biometric data cannot be replaced. We investigate the fundamental trade-offs for four biometric settings. The first is the standard (Ahlswede-Csiszar) secret generation setting, for which we determine the secret-key rate vs. privacy-leakage region. Here leakage corresponds to the mutual information between the helper data and the biometric enrollment sequence, conditional on the secret. In the second setting the secret is not generated by the terminals but chosen independently and transmitted using a public message. Again we determine the region of achievable rate-leakage pairs. In settings three and four we consider zero leakage, i.e. the public message contains only a negligible amount of information about the secret and the biometric enrollment sequence. To achieve this, a private key is needed that can be observed only by the terminals. We again consider both secret generation and secret transmission and determine for both cases the region of achievable secret-key rate vs. private-key rate pairs
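
    To get a feel for the Ahlswede-Csiszar bound quoted above, here is a tiny numerical sketch. It assumes, purely for illustration, a uniform binary enrollment sequence whose authentication observation passes through a binary symmetric channel; the model is not taken from the paper.

```python
# Toy model (assumed for illustration only): enrollment bits X ~ Bernoulli(1/2),
# authentication bits Y = X observed through a BSC with crossover q.
# The Ahlswede-Csiszar bound cited above then gives a maximum secret-key rate
# of I(X;Y) = 1 - h(q) bits per biometric symbol.

from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def max_key_rate(q):
    """I(X;Y) for X ~ Bernoulli(1/2) and Y = X passed through BSC(q)."""
    return 1.0 - h(q)

for q in (0.01, 0.05, 0.1, 0.2):
    print(f"crossover {q:4.2f}: secret-key rate <= {max_key_rate(q):.3f} bit/symbol")
```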

    On the security of XOR-method in biometric authentication systems

    A biometric authentication system can be partitioned into a layer that extracts common randomness out of a pair of related biometric sequences [Maurer, 1993] and a layer that masks a key sequence with this common randomness. We first analyze the performance of such a layered system, and then show that an alternative method, the XOR-technique, is not always secure
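
    One toy illustration of why XOR masking can leak: if the mask bits derived from the biometric are biased rather than uniform, the publicly stored XOR value reveals information about the key. This is an assumed failure mode chosen for illustration, not necessarily the weakness analyzed in the paper.

```python
# Toy single-bit example (assumption: biased mask bits are the leakage source).
# K is a uniform key bit, M a mask bit with the given bias, and C = K xor M is
# stored publicly.  We compute the leakage I(C;K).

from math import log2

def h(p):
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def leakage(bias):
    """I(C;K) in bits for K ~ Bernoulli(1/2), M ~ Bernoulli(bias), C = K xor M."""
    # C is uniform because K is uniform, so H(C) = 1;
    # given K, C has the distribution of M (or its complement), so H(C|K) = h(bias).
    return 1.0 - h(bias)

for b in (0.5, 0.4, 0.3, 0.1):
    print(f"mask bias {b:3.1f}: leakage I(C;K) = {leakage(b):.3f} bit")
```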

    Complexity reduction of the context-tree weighting algorithm: a study for KPN Research


    Capacity and Codes for Embedding Information in Gray-Scale Signals


    A-posteriori symbol probabilities and log-likelihood ratios for coherently detected π/4-DE-QPSK

    In this letter, coherent detection of π/4-DE-QPSK is considered, but our analysis also holds for common DE-QPSK. It is shown that maximum a-posteriori (MAP) sequence detection can be regarded as an approximation, based on selecting dominant exponentials, of MAP symbol detection. A better approximation, relying on piecewise-linear fitting of the logarithm of the hyperbolic cosine, is proposed. This approximation results in a performance very close to optimal symbol detection. For the case where the symbols are produced by convolutional encoding and Gray mapping, the log-likelihood ratios are investigated. Again, a simple approximation based on selecting dominant exponentials and an improved approximation relying on piecewise-linear fits are discussed. As in the uncoded case, the improved approximation gives a performance quite close to ideal. While the particular examples considered show modest gains in performance, this letter provides a way of improving performance when needed
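
    The sketch below compares the two approximations of ln cosh mentioned above: the dominant-exponential one and a two-segment piecewise-linear fit. The breakpoint and slope used here are illustrative choices, not the fit proposed in the letter.

```python
# Sketch (illustrative, not the letter's exact fit): the quantity that appears
# in MAP symbol detection is ln cosh(x).  Keeping only the dominant exponential
# gives ln cosh(x) ~ |x| - ln 2, which is poor near x = 0; a simple
# piecewise-linear fit removes most of that error.

import numpy as np

def logcosh(x):
    # numerically stable: ln cosh(x) = |x| + ln(1 + exp(-2|x|)) - ln 2
    ax = np.abs(x)
    return ax + np.log1p(np.exp(-2 * ax)) - np.log(2)

def dominant_exp(x):
    """Keep only the larger of the two exponentials in cosh."""
    return np.abs(x) - np.log(2)

def piecewise_linear(x, x0=1.4):
    """Two-segment fit; the breakpoint x0 is an assumed, illustrative value."""
    ax = np.abs(x)
    slope = 1.0 - np.log(2) / x0          # chosen so the segments meet at x0
    return np.where(ax <= x0, slope * ax, ax - np.log(2))

x = np.linspace(-5, 5, 2001)
for name, f in (("dominant exponential", dominant_exp),
                ("piecewise linear    ", piecewise_linear)):
    print(f"{name}: max |error| = {np.max(np.abs(f(x) - logcosh(x))):.3f}")
```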

    Universal Noiseless Compression for Noisy Data

    We study universal compression for discrete data sequences that have been corrupted by noise. We show that while, as expected, there exist many cases in which the entropy of these sequences increases relative to that of the original data, somewhat surprisingly and counter-intuitively, the universal coding redundancy of such sequences cannot increase compared to the original data. We derive conditions that guarantee that this redundancy does not decrease asymptotically (in first order) from the original sequence redundancy in the stationary memoryless case. We then provide bounds on the redundancy for coding finite-length (large) noisy blocks generated by stationary memoryless sources and corrupted by some specific memoryless channels. Finally, we propose a sequential probability estimation method that can be used to compress binary data corrupted by some noisy channel. While there is much benefit in using this method for compressing short blocks of noise-corrupted data, the new method is more general and allows sequential compression of binary sequences for which the probability of a bit is known to be confined to a given interval (not necessarily the full [0, 1] range). Additionally, this method has many other applications, including prediction, sequential channel estimation, and others
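
    A quick numerical sketch of the entropy-increase effect mentioned above, assuming for illustration a memoryless Bernoulli source corrupted by a binary symmetric channel (a toy setting, not the paper's general one):

```python
# Toy memoryless setting: a Bernoulli(p) source observed through a BSC with
# crossover q has bias p*q = p(1-q) + (1-p)q, so the entropy of the noisy
# sequence, h(p*q), is at least h(p) -- the entropy increase described above.

from math import log2

def h(p):
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def noisy_bias(p, q):
    """Bias of the observed bit after the BSC (binary convolution of p and q)."""
    return p * (1 - q) + (1 - p) * q

p = 0.1
for q in (0.0, 0.02, 0.05, 0.1):
    print(f"q={q:4.2f}: clean entropy {h(p):.3f}, noisy entropy {h(noisy_bias(p, q)):.3f} bits")
```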

    Low Complexity Sequential Probability Estimation and Universal Compression for Binary Sequences with Constrained Distributions

    Two low-complexity methods are proposed for sequential probability assignment for binary independent and identically distributed (i.i.d.) individual sequences whose empirical distributions have governing parameters known to be bounded within a limited interval. The methods can be applied to problems where fast, accurate estimation of the maximizing sequence probability is essential to minimizing some loss, including finance, learning, channel estimation and decoding, prediction, and universal compression. The application of the new methods to universal compression is studied, and their universal coding redundancies are analyzed. One of the methods is shown to achieve the minimax redundancy within the inner region of the limited parameter interval. The other achieves better performance on the region boundaries and is numerically more robust to outliers. Simulation results support the analysis of both methods. While the non-asymptotic gains over standard methods that maximize the probability over the complete parameter simplex may be significant, the asymptotic gains are of second order. These gains nevertheless translate to meaningful factor gains in other applications, such as financial ones. Moreover, the proposed methods generate estimators that remain constrained within the given interval throughout the estimation process, which is essential in applications such as sequential binary channel crossover estimation. The results for the binary case lay the foundation for studying larger alphabets
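
    As a minimal sketch of constrained sequential probability assignment, here is a simple clipped Krichevsky-Trofimov baseline that keeps the per-step estimate inside a known interval. It is an assumed baseline for illustration, not necessarily either of the two proposed methods.

```python
# Baseline sketch (assumed, not from the paper): a KT sequential estimator
# whose per-step probability assignment is clipped to the known interval
# [a, b], so the estimate never leaves the constrained region.

from math import log2

def constrained_kt_codelength(bits, a, b):
    """Code length (bits) of a sequential assignment clipped to [a, b]."""
    assert 0.0 < a <= b < 1.0
    n0 = n1 = 0
    codelength = 0.0
    for x in bits:
        p1 = (n1 + 0.5) / (n0 + n1 + 1.0)   # KT estimate of P(next bit = 1)
        p1 = min(max(p1, a), b)             # keep the estimate inside [a, b]
        p = p1 if x == 1 else 1.0 - p1
        codelength += -log2(p)
        n1 += x
        n0 += 1 - x
    return codelength

# usage: a sequence whose bias is known to lie in [0.05, 0.25]
seq = [0]*17 + [1]*3
print("clipped KT:", round(constrained_kt_codelength(seq, 0.05, 0.25), 2), "bits")
print("plain   KT:", round(constrained_kt_codelength(seq, 1e-9, 1 - 1e-9), 2), "bits")
```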

    Quantization effects in biometric systems

    The fundamental secret-key rate vs. privacy-leakage rate trade-offs for secret-key generation and transmission for i.i.d. Gaussian biometric sources are determined. These results are the Gaussian equivalents of the results that were obtained for the discrete case by the authors and independently by Lai et al. in 2008. Also the effect that binary quantization of the biometric sequences has on the ratio of the secret-key rate and privacy-leakage rate is considered. It is shown that the squared correlation coefficient must be increased by a factor of π²/4 to compensate for such a quantization action, for values of the privacy-leakage rate that approach zero, when the correlation coefficient is close to zero
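
    One way to see where a π²/4 factor can appear is via the standard arcsine law for sign-quantized jointly Gaussian pairs; the sketch below rests on that assumption and may differ from the paper's own derivation.

```latex
% Sketch (assumes the arcsine law for one-bit quantization of a jointly
% Gaussian pair with correlation rho; not necessarily the paper's argument):
\[
  \rho_b \;=\; \frac{2}{\pi}\arcsin(\rho) \;\approx\; \frac{2}{\pi}\,\rho
  \quad (\rho \to 0),
  \qquad\text{so}\qquad
  \rho_b^2 \;\approx\; \frac{4}{\pi^2}\,\rho^2 ,
\]
% i.e. to recover the same squared correlation after binary quantization,
% rho^2 must be increased by a factor of pi^2/4.
```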
